Sheffield | 25-SDC-Nov | Sheida Shabankari | Sprint 2 | Implement LRU cache in python#105
sheida-shab wants to merge 9 commits into CodeYourFuture:main from
Conversation
- Why not simply import or subclass the Doubly Linked List you implemented in the other exercise (to practice code reuse)?
- Alternatively, you can explore using `OrderedDict` within `LruCache` to maintain order.

Could you improve your `LruCache` implementation using one of these approaches?
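To illustrate the second suggestion, here is a minimal sketch of an LRU cache backed by `OrderedDict`. The `get`/`set` names mirror the PR's interface, but the internal attribute names are assumptions, not the PR's actual code:

```python
from collections import OrderedDict

class LruCache:
    """Sketch of an LRU cache using OrderedDict.

    OrderedDict keeps insertion order and offers O(1)
    move_to_end/popitem, so no hand-rolled linked list is needed.
    """

    def __init__(self, limit):
        if limit <= 0:
            raise ValueError("limit must be greater than zero")
        self.limit = limit
        self._data = OrderedDict()  # hypothetical internal store

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def set(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        elif len(self._data) >= self.limit:
            self._data.popitem(last=False)  # evict least recently used
        self._data[key] = value
```

`popitem(last=False)` pops from the front of the ordering (the oldest entry), which is exactly the LRU eviction rule.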
```python
raise ValueError("limit must be greater than zero")
self.limit = limit
self.map = {}
self.List = LinkedList()
```
Why does the property name `List` start with an uppercase L?
I wasn’t paying attention to Python’s naming conventions for properties. Property names should be lowercase, so I’ve renamed `List` to `list` to follow the standard convention.
```python
def push_head(self, node=None, key=None, value=None) -> Node:
    """
    Adds a node to the head of the list.
    If 'node' is provided, it is moved to the head.
```
Moving a node to the front is not the behavior normally expected from a "push head" operation.
You could consider implementing a helper function, or just have the caller explicitly:
- remove the node to be moved, and
- add that node to the front of the list
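The suggested split could look like the following sketch. The `Node`/`LinkedList` classes and method names here are assumptions for illustration, not the PR's actual implementation; the point is that `move_to_front` composes `remove` and `push_head`, keeping each operation single-purpose:

```python
class Node:
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None

class LinkedList:
    """Minimal doubly linked list sketch (hypothetical names)."""

    def __init__(self):
        self.head = self.tail = None

    def push_head(self, node):
        # Add an already-detached node to the front of the list.
        node.prev, node.next = None, self.head
        if self.head:
            self.head.prev = node
        self.head = node
        if self.tail is None:
            self.tail = node
        return node

    def remove(self, node):
        # Unlink a node from its current position.
        if node.prev:
            node.prev.next = node.next
        else:
            self.head = node.next
        if node.next:
            node.next.prev = node.prev
        else:
            self.tail = node.prev
        node.prev = node.next = None

    def move_to_front(self, node):
        # Helper: explicitly remove, then re-add at the head,
        # instead of overloading push_head with both jobs.
        self.remove(node)
        self.push_head(node)
```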
cjyuan left a comment
Code works.
Consider using a tool to auto-format the code (e.g. an auto-formatter such as black) so that spacing around operators is consistent, which improves readability.
```python
if key in self.map:
    node = self.map[key]
    node.value = value
    self.list.remove(node)
    self.list.push_head(node=node)
else:
    if self.count >= self.limit:
        old_key, old_value = self.list.pop_tail()
        del self.map[old_key]
        self.count -= 1
    new_node = self.list.push_head(key=key, value=value)
    self.count += 1
    self.map[key] = new_node
```
It is possible to use the LinkedList you implemented in the "Linked List" exercise (without modification).
We just have to store the key-value pair as a tuple (or any structured data type).
The drawback is that we can't reuse the node when we want to move it to the front.
```python
if key in self.map:
    node = self.map[key]
    self.list.remove(node)
    self.map[key] = self.list.push_head((key, value))
else:
    if self.count >= self.limit:
        old_key, _ = self.list.pop_tail()
        del self.map[old_key]
        self.count -= 1
    self.map[key] = self.list.push_head((key, value))
    self.count += 1
```
Thanks for your useful feedback.
PR Summary: Implement LRU Cache with O(1) get/set operations

Description:
This PR adds a fully functional LRU (Least Recently Used) cache in Python. The implementation includes:
- `Node` class: represents a key-value pair with pointers to the previous and next nodes.
- Doubly linked list: maintains the order of nodes for LRU eviction in O(1) time.
  - `push_head`: moves or adds a node to the head (most recently used).
  - `pop_tail`: removes and returns the tail node (least recently used).
  - `remove`: removes a specific node from the list.
- `LruCache` class: provides the cache interface.
  - `get(key)`: retrieves a value by key and marks the node as most recently used.
  - `set(key, value)`: adds or updates a key-value pair; evicts the least recently used node if the cache exceeds its limit.
  - Handles invalid cache limits by raising `ValueError`.

Each operation (`get` and `set`) has O(1) worst-case time complexity.

Testing:
Unit tests cover cache limit enforcement, eviction order, and get/set functionality.
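Eviction-order tests of the kind described above might look like the sketch below. The `LruCache` here is a minimal `OrderedDict` stand-in so the tests run on their own; it is an assumption, not the PR's linked-list-backed class, which is what the real tests would target:

```python
from collections import OrderedDict

class LruCache:
    # Minimal stand-in so the tests below are self-contained.
    def __init__(self, limit):
        if limit <= 0:
            raise ValueError("limit must be greater than zero")
        self.limit = limit
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)
        return self._data[key]

    def set(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        elif len(self._data) >= self.limit:
            self._data.popitem(last=False)
        self._data[key] = value

def test_invalid_limit():
    # Cache limit enforcement: non-positive limits are rejected.
    try:
        LruCache(0)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for limit=0")

def test_eviction_order():
    # A get() refreshes recency, so "b" (not "a") is evicted.
    cache = LruCache(2)
    cache.set("a", 1)
    cache.set("b", 2)
    cache.get("a")       # "a" becomes most recently used
    cache.set("c", 3)    # evicts "b", the least recently used
    assert cache.get("b") is None
    assert cache.get("a") == 1
    assert cache.get("c") == 3

test_invalid_limit()
test_eviction_order()
```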